Search Results
Search for: All records
Total Resources: 5
Author / Contributor
- Chen, Jiuhai (5)
- Goldstein, Tom (3)
- Chen, Lichang (2)
- Huang, Heng (2)
- Kang, Lulu (2)
- Zhou, Tianyi (2)
- Bruss, C Bayan (1)
- Catanzaro, Bryan (1)
- Kirchenbauer, John (1)
- Kong, Kezhi (1)
- Lin, Guang (1)
- Liu, Chun (1)
- Ni, Renkun (1)
- Shoeybi, Mohammad (1)
- Soselia, Davit (1)
- Wang, Yiwei (1)
- Zhu, Chen (1)
- Chen, Lichang; Zhu, Chen; Chen, Jiuhai; Soselia, Davit; Zhou, Tianyi; Goldstein, Tom; Huang, Heng; Shoeybi, Mohammad; Catanzaro, Bryan (Forty-first International Conference on Machine Learning (ICML 2024))
- Kong, Kezhi; Chen, Jiuhai; Kirchenbauer, John; Ni, Renkun; Bruss, C Bayan; Goldstein, Tom (Proceedings of Machine Learning Research). Graph transformers have been competitive on graph classification tasks, but they fail to outperform Graph Neural Networks (GNNs) on node classification, a common task performed on large-scale graphs in industrial applications. Meanwhile, existing GNN architectures are limited in their ability to perform equally well on both homophilious and heterophilious graphs, as their inductive biases are generally tailored to only one setting. To address these issues, we propose GOAT, a scalable global graph transformer. In GOAT, each node conceptually attends to all the nodes in the graph, and homophily/heterophily relationships can be learned adaptively from the data. We provide theoretical justification for our approximate global self-attention scheme and show it to be scalable to large-scale graphs. We demonstrate the competitiveness of GOAT on both heterophilious and homophilious graphs with millions of nodes. (An illustrative sketch of this kind of approximate global attention follows this results list.)
- Chen, Jiuhai; Kang, Lulu; Lin, Guang (Technometrics)
- Wang, Yiwei; Chen, Jiuhai; Liu, Chun; Kang, Lulu (Statistics and Computing)
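The GOAT entry above describes an approximate global self-attention scheme in which every node conceptually attends to all nodes of a graph with millions of nodes. As a hedged illustration only (not the paper's actual construction), one common way to make such global attention tractable is to let each node attend to a small set of learned centroids that summarize the keys and values of the whole node set; the class name ApproxGlobalAttention and the num_centroids parameter below are invented for this sketch.

```python
import torch
import torch.nn as nn

class ApproxGlobalAttention(nn.Module):
    """Sketch of global node attention approximated with K learned centroids.

    Each of the N nodes attends to K centroid keys/values instead of all N
    nodes, so the cost is O(N * K) rather than O(N^2). Illustrative only;
    this is not the exact scheme proposed in the GOAT paper.
    """

    def __init__(self, dim: int, num_centroids: int = 64):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        # Learned centroids stand in for the full key/value sets of all nodes.
        self.centroid_keys = nn.Parameter(torch.randn(num_centroids, dim))
        self.centroid_values = nn.Parameter(torch.randn(num_centroids, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [N, dim] feature matrix for every node in the graph.
        q = self.query(x)                                          # [N, dim]
        scores = q @ self.centroid_keys.T / (x.shape[-1] ** 0.5)   # [N, K]
        attn = torch.softmax(scores, dim=-1)                       # [N, K]
        return attn @ self.centroid_values                         # [N, dim]

# Usage: 10,000 nodes "globally" attend through only 64 centroids.
x = torch.randn(10_000, 128)
out = ApproxGlobalAttention(dim=128)(x)
print(out.shape)  # torch.Size([10000, 128])
```

The design point is simply that replacing the N-by-N attention matrix with an N-by-K one keeps memory and compute linear in the number of nodes, which is what makes a "global" transformer feasible on large graphs.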